# 7B DPO Alpha

A 7B-parameter causal language model trained on multi-source datasets, optimized with DPO (Direct Preference Optimization), supporting Chinese and English text generation.

- Task: Large Language Model
- Framework: Transformers · Supports Multiple Languages
- Publisher: CausalLM
- Downloads: 131 · Likes: 54
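DPO trains directly on preference pairs rather than fitting a separate reward model. As a minimal sketch (not this model's actual training code), the per-pair loss can be computed from the total log-probabilities of a chosen and a rejected response under the policy and a frozen reference model; the `beta` value and the toy log-probabilities are illustrative assumptions:

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for one preference pair.

    Inputs are sequence log-probabilities of the chosen and rejected
    responses under the policy and the frozen reference model.
    """
    chosen_ratio = policy_chosen_logp - ref_chosen_logp
    rejected_ratio = policy_rejected_logp - ref_rejected_logp
    margin = beta * (chosen_ratio - rejected_ratio)
    # -log(sigmoid(margin)): the loss shrinks as the policy prefers the
    # chosen response more strongly than the reference model does.
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# Policy favors the chosen response more than the reference: low loss.
low = dpo_loss(-10.0, -14.0, -12.0, -12.0)
# Policy favors the rejected response instead: high loss.
high = dpo_loss(-14.0, -10.0, -12.0, -12.0)
```

When policy and reference agree exactly, the margin is zero and the loss is `log 2`, the sigmoid cross-entropy of an uninformative prediction.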
# RobBERT 2022 Dutch Sentence Transformers

A Dutch sentence-embedding model based on RobBERT that maps text into a 768-dimensional vector space, suitable for semantic search and text-similarity computation.

- License: Apache-2.0
- Task: Text Embedding
- Framework: Transformers · Other
- Publisher: NetherlandsForensicInstitute
- Downloads: 8,375 · Likes: 12
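Embeddings from a model like this are typically compared with cosine similarity. A minimal sketch of that downstream computation; the toy 4-dimensional vectors are stand-ins for the 768-dimensional outputs the model itself would produce (e.g. via the sentence-transformers library):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional stand-ins for the model's 768-dimensional embeddings.
query = [0.1, 0.3, 0.5, 0.1]
doc_close = [0.1, 0.28, 0.52, 0.1]
doc_far = [0.9, -0.2, 0.05, -0.4]

# Semantic search ranks documents by similarity to the query:
# a higher cosine means the texts are closer in embedding space.
score_close = cosine_similarity(query, doc_close)
score_far = cosine_similarity(query, doc_far)
```

Because cosine similarity ignores vector magnitude, it compares only the direction of the embeddings, which is why it is the standard choice for sentence-embedding search.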
# SlovakBERT

A case-sensitive model pretrained on Slovak text with a masked language modeling (MLM) objective.

- License: MIT
- Task: Large Language Model
- Framework: Transformers · Other
- Publisher: gerulata
- Downloads: 5,009 · Likes: 23
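The MLM objective hides a fraction of the input tokens and trains the model to reconstruct them from context. A minimal sketch of the masking step only; the Slovak tokens, mask rate, and seed are illustrative assumptions, not this model's actual preprocessing:

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style masking: hide roughly mask_prob of the tokens and
    record the labels the model must reconstruct during training."""
    rng = random.Random(seed)
    masked, labels = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok      # target the model should predict
            masked.append(MASK)  # input the model actually sees
        else:
            masked.append(tok)
    return masked, labels

tokens = ["pes", "beží", "cez", "mesto", "k", "rieke"]
# A higher-than-default rate so the toy example visibly masks tokens.
masked, labels = mask_tokens(tokens, mask_prob=0.3, seed=1)
```

At training time the model receives `masked` as input and is penalized only on the positions recorded in `labels`, so unmasked tokens contribute no loss.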
# Wav2Vec2 XLS-R 300M Hebrew

A Hebrew automatic speech recognition model fine-tuned from facebook/wav2vec2-xls-r-300m, with performance improved through two-stage training on small- and large-scale datasets.

- Task: Speech Recognition
- Framework: Transformers · Other
- Publisher: imvladikon
- Downloads: 1.2M · Likes: 4